Sparse, guided feature connections in an Abstract Deep Network

Authors

  • Anthony Knittel
  • Alan D. Blair
Abstract

We present a technique for developing a network of re-used features, known as an Abstract Deep Network (ADN), in which the topology is formed by a coarse learning method that permits subsequent gradient-descent fine-tuning. New features are built from observed co-occurrences, and the network is maintained by a selection process related to evolutionary algorithms. This allows coarse exploration of the problem space, effective for irregular domains, while gradient descent provides precise solutions. Accuracy on standard UCI and protein-structure-prediction problems is comparable with benchmark SVM and optimized GBML approaches, and shows scalability to large problems. The discrete implementation is symbolic, allowing interpretability, while the continuous method with fine-tuning shows improved accuracy. The binary multiplexer problem is explored as an irregular domain that does not support gradient-descent learning; the system solves the benchmark 135-bit problem. A convolutional implementation is demonstrated on image classification, achieving an error rate of 0.79% on the MNIST problem without a pre-defined topology. The ADN system thus provides a method for developing a very sparse, deep feature topology, based on observed relationships between features, that can find solutions in irregular domains and initialize a network prior to gradient-descent learning.
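As context for the multiplexer benchmark mentioned in the abstract: the k-bit multiplexer takes k address bits followed by 2^k data bits and outputs the data bit selected by the address, so the 135-bit problem uses 7 address bits and 128 data bits. A minimal sketch of the target function (the function name and interface here are illustrative, not from the paper):

```python
def multiplexer(bits):
    """Evaluate the k-bit multiplexer target function.

    The first k bits form a binary address that selects one of the
    2**k remaining data bits; the selected bit is the output.
    k is inferred from the input length, which must equal k + 2**k
    (e.g. 6 = 2 + 4, 135 = 7 + 128).
    """
    k = 0
    while k + 2 ** k < len(bits):
        k += 1
    assert k + 2 ** k == len(bits), "length must equal k + 2**k"
    # Interpret the first k bits as a binary address.
    address = int("".join(map(str, bits[:k])), 2)
    # Return the data bit that the address points to.
    return bits[k + address]
```

For example, `multiplexer([0, 1, 1, 0, 1, 0])` reads address `01` (= 1) and returns the second data bit, 0. The function is highly irregular in Hamming space, which is why the abstract uses it as a domain where gradient descent alone struggles.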


Similar Articles

Link Prediction using Network Embedding based on Global Similarity

Background: Link prediction is one of the most widely studied problems in complex network analysis. It requires knowing the history of previous link connections and combining it with available information. Local link-prediction approaches based on node-structure objectives are fast but not accurate enough. On the other hand, the global link predicti...


An efficient method for cloud detection based on the feature-level fusion of Landsat-8 OLI spectral bands in deep convolutional neural network

Cloud segmentation is a critical pre-processing step for any multi-spectral satellite image application. In particular, disaster-related applications e.g., flood monitoring or rapid damage mapping, which are highly time and data-critical, require methods that produce accurate cloud masks in a short time while being able to adapt to large variations in the target domain (induced by atmospheric c...


Deep Adaptive Network: An Efficient Deep Neural Network with Sparse Binary Connections

Deep neural networks are state-of-the-art models for understanding the content of images, video and raw input data. However, implementing a deep neural network in embedded systems is a challenging task, because a typical deep neural network, such as a Deep Belief Network using 128×128 images as input, can exhaust gigabytes of memory and create bandwidth and computing bottlenecks. To addre...


Fiber Orientation Estimation Guided by a Deep Network

Diffusion magnetic resonance imaging (dMRI) is currently the only tool for noninvasively imaging the brain's white matter tracts. The fiber orientation (FO) is a key feature computed from dMRI for tract reconstruction. Because the number of FOs in a voxel is usually small, dictionary-based sparse reconstruction has been used to estimate FOs. However, accurate estimation of complex FO configurat...


Group sparse regularization for deep neural networks

In this paper, we consider the joint task of simultaneously optimizing (i) the weights of a deep neural network, (ii) the number of neurons for each hidden layer, and (iii) the subset of active input features (i.e., feature selection). While these problems are generally dealt with separately, we present a simple regularized formulation that solves all three of them in parallel, using stan...



Journal:
  • CoRR

Volume: abs/1412.4967  Issue: 

Pages: -

Publication year: 2014